The app that "undresses" people has been blocked by the Italian Data Protection Authority: "Fake photos without warning or consent."

ROME – The Italian Data Protection Authority has ordered, as a matter of urgency and with immediate effect, a temporary restriction on the processing of Italian users' personal data by a company based in the British Virgin Islands that operates Clothoff, an app that digitally "undresses" people.
"Clothoff offers a generative AI service that makes it possible, free of charge and for a fee, to generate deep nude images, i.e. fake photos and videos that portray real people in nude or sexually explicit or even pornographic poses", explains the Guarantor.
"The application allows anyone—including minors —to create photographs and videos from images, even those of minors, without any means of verifying the consent of the person portrayed and without any warnings regarding the artificial nature of the photos and videos," the Authority notes.
The block, which will be followed by an investigation aimed at combating all apps of this kind, was made necessary by the high risks such services can pose to fundamental rights and freedoms, particularly the protection of personal dignity, the right to privacy and the protection of the personal data of those involved in this type of processing, especially when minors are concerned.
This is borne out by numerous recent Italian news stories, which show how the abuse of such artificial images is now generating genuine social alarm.